What is lru-cache?
The lru-cache package is a JavaScript library that provides a cache object that deletes the least-recently-used items. It is useful for storing a limited amount of data in a way that allows for fast retrieval of entries based on keys.
What are lru-cache's main functionalities?
Creating a cache instance
This code sample demonstrates how to create a new LRU cache instance with a maximum of 500 items and a maximum age of one hour for each item.
{"const LRU = require('lru-cache');
const options = { max: 500, maxAge: 1000 * 60 * 60 };
const cache = new LRU(options);"}
Setting and getting cache items
This code sample shows how to set a value in the cache with a key and then retrieve that value using the same key.
{"const LRU = require('lru-cache');
const cache = new LRU();
cache.set('key', 'value');
const value = cache.get('key');"}
Checking if a key is in the cache
This code sample illustrates how to check if a key is present in the cache without updating the recent-ness or deleting it.
{"const LRU = require('lru-cache');
const cache = new LRU();
cache.set('key', 'value');
const hasKey = cache.has('key');"}
Deleting a key from the cache
This code sample shows how to delete a specific key from the cache.
{"const LRU = require('lru-cache');
const cache = new LRU();
cache.set('key', 'value');
cache.del('key');"}
Resetting the cache
This code sample demonstrates how to completely clear the cache.
{"const LRU = require('lru-cache');
const cache = new LRU();
cache.set('key', 'value');
cache.reset();"}
Other packages similar to lru-cache
node-cache
node-cache is an in-memory cache module for Node.js. It is similar to lru-cache but does not specifically implement the LRU (Least Recently Used) cache algorithm. Instead, it provides a simple caching mechanism with TTL (time to live) support.
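For illustration, a minimal node-cache sketch; it assumes the stdTTL option and the set/get methods from node-cache's documentation, which may differ between versions:
const NodeCache = require('node-cache');
// Every entry gets a default time-to-live of 60 seconds.
const myCache = new NodeCache({ stdTTL: 60 });
myCache.set('key', 'value');
const value = myCache.get('key'); // 'value', or undefined once the TTL has expired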
cache-manager
cache-manager is a cache module that allows easy switching between different cache stores. It supports a variety of stores (e.g., memory, Redis, MongoDB) and includes LRU cache functionality. It is more flexible than lru-cache in terms of storage options but may be more complex to use.
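For illustration, a rough cache-manager sketch; it assumes the older caching() factory with the built-in memory store, and newer major versions have changed this API:
const cacheManager = require('cache-manager');
// In-memory store capped at 100 entries, with a 10-second TTL.
const memoryCache = cacheManager.caching({ store: 'memory', max: 100, ttl: 10 });
async function demo() {
  await memoryCache.set('foo', 'bar');
  const value = await memoryCache.get('foo'); // 'bar'
}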
quick-lru
quick-lru is an LRU cache implementation that is optimized for performance. It claims to be faster than lru-cache for certain use cases, especially when dealing with a large number of items or frequent evictions.
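For illustration, a minimal quick-lru sketch; it assumes the maxSize option and the set/get methods from quick-lru's documentation, and recent versions are published as ES modules only:
import QuickLRU from 'quick-lru';
// Keeps at most 1000 entries, evicting the least recently used ones.
const lru = new QuickLRU({ maxSize: 1000 });
lru.set('key', 'value');
lru.get('key'); // 'value'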
lru cache
A cache object that deletes the least-recently-used items.
Installation:
npm install lru-cache --save
Usage:
var assert = require("assert")
var LRU = require("lru-cache")
  , options = { max: 500
              , length: function (n, key) { return n * 2 + key.length }
              , dispose: function (key, n) { n.close() }
              , maxAge: 1000 * 60 * 60 }
  , cache = LRU(options)
  , otherCache = LRU(50) // sets just the max size

cache.set("key", "value")
cache.get("key") // "value"

// Non-string keys ARE fully supported, but object keys are not
// toString()-ed, and a new object with the same content won't match.
var someObject = { a: 1 }
cache.set(someObject, 'a value')
cache.set('[object Object]', 'a different value')
assert.equal(cache.get(someObject), 'a value')
assert.equal(cache.get({ a: 1 }), undefined)

cache.reset() // empty the cache
If you put more stuff in it, then items will fall out.
If you try to put an oversized thing in it, then it'll fall out right away.
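For example, a small sketch of that eviction behavior, using illustrative sizes:
var LRU = require("lru-cache")

var cache = LRU({ max: 2 })   // each item counts as length 1 by default
cache.set("a", 1)
cache.set("b", 2)
cache.set("c", 3)             // over the max, so the least-recently-used key falls out
cache.get("a")                // undefined; "a" was evicted

// An item bigger than max never gets stored at all:
var sized = LRU({ max: 5, length: function (n) { return n.length } })
sized.set("big", "this value is too long")   // longer than max (5), so it falls out right away
sized.has("big")                             // false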
Options
max
The maximum size of the cache, checked by applying the length
function to all values in the cache. Not setting this is kind of
silly, since that's the whole purpose of this lib, but it defaults
to Infinity.
maxAge
Maximum age in ms. Items are not pro-actively pruned out
as they age, but if you try to get an item that is too old, it'll
drop it and return undefined instead of giving it to you.
length
Function that is used to calculate the length of stored
items. If you're storing strings or buffers, then you probably want
to do something like function(n, key){return n.length}.
The default is function(){return 1}, which is fine if you want to
store max like-sized things. The item is passed as the first
argument, and the key is passed as the second argument.
dispose
Function that is called on items when they are dropped
from the cache. This can be handy if you want to close file
descriptors or do other cleanup tasks when items are no longer
accessible. Called with key, value. It's called before
actually removing the item from the internal cache, so if you want
to immediately put it back in, you'll have to do that in a
nextTick or setTimeout callback or it won't do anything.
stale
By default, if you set a maxAge, it'll only actually pull
stale items out of the cache when you get(key). (That is, it's
not pre-emptively doing a setTimeout or anything.) If you set
stale:true, it'll return the stale value before deleting it. If
you don't set this, then it'll return undefined when you try to
get a stale entry, as if it had already been deleted.
noDisposeOnSet
By default, if you set a dispose() method, then
it'll be called whenever a set() operation overwrites an existing
key. If you set this option, dispose() will only be called when a
key falls out of the cache, not when it is overwritten.
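A short sketch of how these options fit together (illustrative values only):
var LRU = require("lru-cache")

var cache = LRU({
  max: 2,                                   // at most 2 units of total length
  length: function (n, key) { return 1 },   // count every entry as 1
  maxAge: 1000 * 60,                        // entries older than a minute are stale
  stale: true,                              // hand back a stale value once before dropping it
  dispose: function (key, n) {
    // cleanup hook: runs when an entry is dropped from the cache
    console.log("dropped", key)
  },
  noDisposeOnSet: true                      // don't run dispose when a key is merely overwritten
})

cache.set("a", 1)
cache.set("a", 2)   // overwrite: dispose is not called, because of noDisposeOnSet
cache.set("b", 3)
cache.set("c", 4)   // over max, so "a" falls out and dispose("a", 2) runs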
API
- set(key, value, maxAge)
- get(key) => value
Both of these will update the "recently used"-ness of the key.
They do what you think. maxAge is optional and overrides the
cache maxAge option if provided.
If the key is not found, get() will return undefined.
The key and val can be any value.
- peek(key)
Returns the key value (or undefined if not found) without
updating the "recently used"-ness of the key.
(If you find yourself using this a lot, you might be using the
wrong sort of data structure, but there are some use cases where
it's handy.)
- del(key)
Deletes a key out of the cache.
- reset()
Clear the cache entirely, throwing away all values.
- has(key)
Check if a key is in the cache, without updating the recent-ness
or deleting it for being stale.
- forEach(function(value,key,cache), [thisp])
Just like Array.prototype.forEach. Iterates over all the keys
in the cache, in order of recent-ness. (Ie, more recently used
items are iterated over first.)
- rforEach(function(value,key,cache), [thisp])
The same as cache.forEach(...) but items are iterated over in
reverse order. (ie, less recently used items are iterated over
first.)
- keys()
Return an array of the keys in the cache.
- values()
Return an array of the values in the cache.
- length
Return total length of objects in cache taking into account the
length options function.
- itemCount
Return total quantity of objects currently in cache. Note that
stale (see options) items are returned as part of this item
count.
- dump()
Return an array of the cache entries ready for serialization and usage
with destinationCache.load(arr).
- load(cacheEntriesArray)
Loads another cache entries array, obtained with sourceCache.dump(),
into the cache. The destination cache is reset before loading new entries.
- prune()
Manually iterates over the entire cache, proactively pruning old entries.
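A small sketch tying several of these methods together (return values in the comments follow the descriptions above):
var LRU = require("lru-cache")

var cache = LRU({ max: 3, maxAge: 1000 * 60 })
cache.set("a", 1)
cache.set("b", 2)
cache.set("c", 3, 1000)   // per-item maxAge (1 second) overrides the cache-wide option

cache.peek("a")           // 1, without marking "a" as recently used
cache.keys()              // the keys currently in the cache
cache.forEach(function (value, key) {
  console.log(key, value) // most recently used entries come first
})

// Serialize one cache and load its entries into another
var copy = LRU({ max: 3 })
copy.load(cache.dump())
copy.get("b")             // 2

cache.prune()             // proactively drop entries that have outlived their maxAge
cache.reset()             // throw everything away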